Sharp Ritz Value Estimates for Restarted Krylov Subspace Iterations
Abstract
Gradient iterations for the Rayleigh quotient are elemental methods for computing the smallest eigenvalues of a pair of symmetric and positive definite matrices. A considerable convergence acceleration can be achieved by preconditioning and by computing Rayleigh-Ritz approximations from subspaces of increasing dimensions; the generalized Davidson method is one example of the resulting Krylov subspace eigensolvers. Krylov subspace iterations can be restarted in order to limit their computer storage requirements. For the restarted Krylov subspace eigensolvers, a Chebyshev-type convergence estimate was presented by Knyazev in [Russian J. Numer. Anal. Math. Modelling, 2:371-396, 1987]. This estimate has been generalized to arbitrary eigenvalue intervals in [SIAM J. Matrix Anal. Appl., 37(3):955-975, 2016]. The generalized Ritz value estimate is not sharp, as it depends on only three eigenvalues. In the present paper, we extend the latter analysis by generalizing the geometric approach from [SIAM J. Matrix Anal. Appl., 32(2):443-456, 2011] in order to derive a sharp Ritz value estimate for restarted Krylov subspace iterations.
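The following sketch illustrates the kind of iteration analyzed here: a Krylov subspace of fixed dimension is built, a Rayleigh-Ritz step extracts the smallest Ritz pair, and the iteration restarts with that Ritz vector. It assumes dense NumPy matrices and direct solves with M; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh

def restarted_krylov_eig(A, M, x0, k=4, restarts=100, tol=1e-10):
    """Hypothetical sketch of a restarted Krylov subspace eigensolver
    for the smallest eigenvalue of an SPD pencil (A, M)."""
    x = x0 / np.sqrt(x0 @ (M @ x0))        # M-normalize the start vector
    for _ in range(restarts):
        # Krylov basis of dimension k for the operator inv(M) A
        V = np.empty((x.size, k))
        V[:, 0] = x
        for j in range(1, k):
            V[:, j] = np.linalg.solve(M, A @ V[:, j - 1])
        V, _ = np.linalg.qr(V)             # orthonormalize for stability
        # Rayleigh-Ritz projection of the pencil onto span(V)
        theta, S = eigh(V.T @ A @ V, V.T @ M @ V)
        x = V @ S[:, 0]                    # restart with the smallest Ritz vector
        x /= np.sqrt(x @ (M @ x))
        if np.linalg.norm(A @ x - theta[0] * (M @ x)) < tol:
            break
    return theta[0], x
```

A preconditioner, as in the generalized Davidson method, would replace the exact solves with M by applications of an approximate inverse.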
Similar papers
Convergence Analysis of Restarted Krylov Subspace Eigensolvers
The A-gradient minimization of the Rayleigh quotient allows one to construct robust and fast-convergent eigensolvers for the generalized eigenvalue problem for (A,M) with symmetric and positive definite matrices. The A-gradient steepest descent iteration is the simplest case of more general restarted Krylov subspace iterations, namely the special case in which all step-wise generated Krylov subspaces are two-dimensional ...
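For illustration, a minimal sketch of this two-dimensional special case, assuming dense NumPy matrices and direct solves with A (the function name is illustrative):

```python
import numpy as np
from scipy.linalg import eigh

def a_gradient_steepest_descent(A, M, x, steps=200):
    """Sketch: Rayleigh-Ritz on the two-dimensional subspace spanned by
    the iterate x and its A-gradient direction inv(A)(Ax - rho*Mx)."""
    for _ in range(steps):
        rho = (x @ (A @ x)) / (x @ (M @ x))            # Rayleigh quotient
        g = np.linalg.solve(A, A @ x - rho * (M @ x))  # A-gradient of rho
        V, _ = np.linalg.qr(np.column_stack([x, g]))   # 2D trial space
        theta, S = eigh(V.T @ A @ V, V.T @ M @ V)
        x = V @ S[:, 0]                                # smallest Ritz vector
    return (x @ (A @ x)) / (x @ (M @ x)), x
```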
Deflated Restarting for Matrix Functions
We investigate an acceleration technique for restarted Krylov subspace methods for computing the action of a function of a large sparse matrix on a vector. Its effect is to ultimately deflate a specific invariant subspace of the matrix which most impedes the convergence of the restarted approximation process. An approximation to the subspace to be deflated is successively refined in the course ...
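As background, the single-cycle Krylov approximation that such restarted methods build on is f(A)b ≈ β V_m f(H_m) e_1 from an Arnoldi decomposition. A sketch for f = exp follows (an assumed choice of f; the deflated restarting refinement described in the abstract is not shown, and the function name is illustrative):

```python
import numpy as np
from scipy.linalg import expm

def arnoldi_fun_exp(A, b, m=30):
    """Single Arnoldi cycle approximating exp(A) b."""
    n = b.size
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # lucky breakdown
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m)
    e1[0] = beta
    # f(A) b is approximated by beta * V_m f(H_m) e_1
    return V[:, :m] @ (expm(H[:m, :m]) @ e1)
```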
Augmented Implicitly Restarted Lanczos Bidiagonalization Methods
New restarted Lanczos bidiagonalization methods for the computation of a few of the largest or smallest singular values of a large matrix are presented. Restarting is carried out by augmentation of Krylov subspaces that arise naturally in the standard Lanczos bidiagonalization method. The augmenting vectors are associated with certain Ritz or harmonic Ritz vectors. Computed examples show the ne...
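A bare sketch of the underlying Golub-Kahan Lanczos bidiagonalization, whose Krylov subspaces the restarts augment (dense NumPy, full reorthogonalization; illustrative only, not the authors' implementation):

```python
import numpy as np

def lanczos_bidiag(A, p0, m):
    """m steps of Golub-Kahan bidiagonalization: A P = Q B with B
    upper bidiagonal; singular values of B approximate those of A."""
    n_rows, n_cols = A.shape
    P = np.zeros((n_cols, m))
    Q = np.zeros((n_rows, m))
    B = np.zeros((m, m))
    P[:, 0] = p0 / np.linalg.norm(p0)
    beta = 0.0
    for j in range(m):
        q = A @ P[:, j] - (beta * Q[:, j - 1] if j > 0 else 0.0)
        alpha = np.linalg.norm(q)
        Q[:, j] = q / alpha
        B[j, j] = alpha
        p = A.T @ Q[:, j] - alpha * P[:, j]
        # full reorthogonalization keeps the basis numerically orthonormal
        p -= P[:, :j + 1] @ (P[:, :j + 1].T @ p)
        beta = np.linalg.norm(p)
        if j + 1 < m:
            B[j, j + 1] = beta
            P[:, j + 1] = p / beta
    return np.linalg.svd(B, compute_uv=False)
```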
Implicitly Restarted Generalized Second-order Arnoldi Type Algorithms for the Quadratic Eigenvalue Problem
We investigate the generalized second-order Arnoldi (GSOAR) method, a generalization of the SOAR method proposed by Bai and Su [SIAM J. Matrix Anal. Appl., 26:640-659, 2005], and the Refined GSOAR (RGSOAR) method for the quadratic eigenvalue problem (QEP). The two methods use the GSOAR procedure to generate an orthonormal basis of a given generalized second-order Krylov subspace, and with su...
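For reference, the QEP (λ²M + λC + K)x = 0 can always be reduced to a linear generalized eigenvalue problem of twice the size; SOAR-type methods avoid this doubling by working in a second-order Krylov subspace of the original dimension n. A plain companion-linearization baseline, illustrative only and not the GSOAR/RGSOAR algorithm:

```python
import numpy as np
from scipy.linalg import eig

def qep_by_linearization(M, C, K):
    """Solve (lam^2 M + lam C + K) x = 0 via the first companion form
    A z = lam B z with z = [x; lam x] (size 2n baseline)."""
    n = M.shape[0]
    I = np.eye(n)
    Z = np.zeros((n, n))
    A = np.block([[Z, I], [-K, -C]])
    B = np.block([[I, Z], [Z, M]])
    lam, Zv = eig(A, B)
    return lam, Zv[:n, :]   # eigenvalues and the x-part of eigenvectors
```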
Implicitly Restarted GMRES and Arnoldi Methods for Nonsymmetric Systems of Equations
The generalized minimum residual method (GMRES) is well known for solving large nonsymmetric systems of linear equations. It generally uses restarting, which slows the convergence. However, some information can be retained at the time of the restart and used in the next cycle. We present algorithms that use implicit restarting in order to retain this information. Approximate eigenvectors determ...
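Plain restarted GMRES, which discards all subspace information at each restart, is available in SciPy; the usage sketch below (with an assumed random test matrix) shows the baseline that implicit restarting improves on:

```python
import numpy as np
from scipy.sparse import random as sprandom, eye as speye
from scipy.sparse.linalg import gmres

# Assumed test problem: a sparse, diagonally dominated nonsymmetric matrix.
rng = np.random.default_rng(0)
n = 500
A = speye(n) + 0.2 * sprandom(n, n, density=0.01, random_state=rng)
b = rng.standard_normal(n)

# GMRES(30): restart every 30 inner iterations, up to 200 cycles.
x, info = gmres(A, b, restart=30, maxiter=200)
print("converged" if info == 0 else f"info = {info}",
      "residual:", np.linalg.norm(A @ x - b))
```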